5.0 - 10.0 years
5 - 10 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Senior Data Engineer

Key Responsibilities
As a Senior Data Engineer, you will:
- Data Pipeline Development: Design, build, and maintain scalable data pipelines using PySpark and Python (see the sketch below).
- AWS Cloud Integration: Work with AWS cloud services (S3, Lambda, Glue, EMR, Redshift) for data ingestion, processing, and storage.
- ETL Workflow Management: Implement and maintain ETL workflows using DBT and orchestration tools (e.g., Airflow).
- Data Warehousing: Design and manage data models in Snowflake, ensuring performance and reliability.
- SQL Optimization: Utilize SQL for querying and optimizing datasets across different databases.
- Data Integration: Integrate and manage data from MongoDB, Kafka, and other streaming or NoSQL sources.
- Collaboration & Support: Collaborate with data scientists, analysts, and other engineers to support advanced analytics and Machine Learning (ML) initiatives.
- Data Quality & Governance: Ensure data quality, lineage, and governance through best practices and tools.

Mandatory Skills & Experience
- Strong programming skills in Python and PySpark.
- Hands-on experience with AWS data services (S3, Lambda, Glue, EMR, Redshift).
- Proficiency in SQL and experience with DBT for data transformation.
- Experience with Snowflake for data warehousing.
- Knowledge of MongoDB, Kafka, and data streaming concepts.
- Good understanding of data architecture, data modeling, and data governance.
- Familiarity with large-scale data platforms.

Essential Professional Skills
- Excellent problem-solving skills.
- Ability to work independently or as part of a team.
- Experience with CI/CD and DevOps practices in a data engineering environment (plus).

Qualifications
- Proven hands-on experience working with large-scale data platforms.
- Strong background in Python, PySpark, AWS, and modern data warehousing tools such as Snowflake and DBT.
- Familiarity with NoSQL databases like MongoDB and real-time streaming platforms like Kafka.
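For context, a minimal sketch of the kind of PySpark pipeline this role describes: reading raw events from S3, cleansing them, and writing curated Parquet. The bucket paths, column names, and cleansing rules are illustrative assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingest raw JSON events from S3 (bucket and path are placeholders).
raw = spark.read.json("s3a://example-bucket/raw/orders/")

# Basic cleansing: deduplicate on the business key, cast types, drop bad rows.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write partitioned Parquet back to S3 for downstream Glue/Redshift/Snowflake loads.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/orders/"
)
```

In practice a job like this would be scheduled by an orchestrator such as Airflow, which the posting names as the expected tooling.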
Posted 6 days ago
3.0 - 9.0 years
3 - 9 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
This role is for an experienced automation tester who understands the overall architecture and can grasp the systems involved, the integration points, etc. The candidate will work closely with the test lead and the development team to write automation test scripts and identify the areas that are critical from an end-to-end testing perspective. We need a senior engineer who has worked on medium- to large-scale projects and programs in the past and can therefore add value to the overall program.
- Hands-on experience in Snowflake, preferably with SnowPro Core Certification.
- Hands-on experience in AWS development.
- 3+ years of experience programming with Python or a similar programming language.
- 4+ years of experience with SQL.
- Expertise in SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi and SnapLogic.
- Experience working in a DevOps environment, with version control and Git workflows.
- Well versed in data analysis, data warehousing, and SDLC concepts.
- Experience in software delivery and agile methodologies.
- Excellent software design, programming, engineering, and problem-solving skills.
- Strong experience working on data ingestion and transformation (stored procedures/JavaScript).
- Experience in strategies for data testing, data quality, code quality, and code coverage (see the sketch below).
- Ability, willingness, and openness to experiment with, evaluate, and adopt new technologies.
- Passion for technology, problem solving, and teamwork.

Mandatory Skills
- Snowflake, preferably with SnowPro Core Certification
- Python or a similar programming language
- Sound knowledge of the finance domain
- ETL/ELT tools like NiFi, SnapLogic
- DevOps environment
- AWS development
- Data warehousing and SDLC concepts
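As a flavour of the data-testing automation this role involves, here is a hedged sketch of a row-count reconciliation check against Snowflake using snowflake-connector-python. The account, credentials, and table names are placeholders, not details from the posting.

```python
import snowflake.connector

def fetch_count(cur, table: str) -> int:
    """Return the row count of a fully qualified table."""
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

conn = snowflake.connector.connect(
    account="my_account", user="tester", password="***",
    warehouse="QA_WH", database="EDW",
)
cur = conn.cursor()

src = fetch_count(cur, "STG.ORDERS_RAW")  # landing table fed by Snowpipe
tgt = fetch_count(cur, "MART.ORDERS")     # transformed target table

assert src == tgt, f"Row-count mismatch: staged {src} vs loaded {tgt}"
conn.close()
```

A real regression suite would wrap checks like this in a test framework and compare checksums or column aggregates, not just counts.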
Posted 6 days ago
0.0 - 5.0 years
0 - 5 Lacs
Pune, Maharashtra, India
On-site
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
- Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.

Preferred technical and professional experience
- Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling (see the sketch below).
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
- Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks.
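To make the Snowflake tuning items above concrete, a hedged sketch of adding a clustering key and profiling recent queries. The table, columns, and connection details are illustrative assumptions, not specifics from the posting.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="engineer", password="***",
    warehouse="ETL_WH", database="EDW", schema="PUBLIC",
)
cur = conn.cursor()

# Cluster a large fact table on its most common filter columns.
cur.execute("ALTER TABLE FACT_SALES CLUSTER BY (SALE_DATE, REGION_ID)")

# Profile the slowest queries of the last day from the account usage view.
cur.execute("""
    SELECT query_text, total_elapsed_time / 1000 AS seconds
    FROM snowflake.account_usage.query_history
    WHERE start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""")
for query_text, seconds in cur.fetchall():
    print(f"{seconds:8.1f}s  {query_text[:80]}")
conn.close()
```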
Posted 6 days ago
6.0 - 9.0 years
2 - 11 Lacs
Pune, Maharashtra, India
On-site
Responsibilities:
- Review sales proposals, contract terms, and pricing models to ensure accurate deal construction, and recommend pricing strategies that maximize revenue margin.
- Ensure deals comply with pricing policies, legal standards, and regulatory requirements.
- Partner with sales, finance, legal, and other partners to lead efficient deal analysis, negotiation, and execution.
- Monitor deal performance, identify trends, and provide relevant insights to improve the sales process.
- Use tools such as Salesforce and Snowflake to improve deal data.
- Analyze historical deal data to uncover patterns in discounting and concessions, and identify areas for improvement (see the sketch below).
- Be a subject matter expert in pricing strategies, quoting protocols, and deal structuring best practices.

What You'll Need to be Successful
- 6 or more years of experience reviewing sales proposals, contract terms, and pricing models to ensure accurate deal construction and recommend pricing strategies that maximize revenue margin.
- Proficiency in Salesforce and Snowflake, with an understanding of data integration between systems and data sources.
- Advanced Excel skills (able to perform complex functions), including the ability to build pro formas and conduct profitability analyses.
- Ability to multitask and prioritize during high-volume periods; availability during end-of-month and end-of-quarter close cycles is required.
- Highly organized, detail-oriented, innovative, and customer-focused.
- Ability to assess the implications of negotiated clauses and operational requirements.
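Purely for illustration, a minimal pandas pass over exported deal data to surface discounting patterns of the kind the posting describes. The file name and columns are assumptions; real data would come from Salesforce or Snowflake extracts.

```python
import pandas as pd

deals = pd.read_csv("deals_export.csv")  # hypothetical export of historical deals

# Effective discount per deal, then averages by segment and quarter.
deals["discount_pct"] = 1 - deals["net_price"] / deals["list_price"]
summary = (
    deals.groupby(["segment", "quarter"])["discount_pct"]
         .agg(["mean", "median", "count"])
         .sort_values("mean", ascending=False)
)
print(summary.head(10))  # segments with the heaviest average discounting
```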
Posted 6 days ago
5.0 - 8.0 years
5 - 8 Lacs
Chennai, Tamil Nadu, India
On-site
Qualification
- Talend: Design, develop, and document existing Talend ETL processes, technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment.
- AWS / Snowflake:
  - Design, develop, and maintain data models using SQL and Snowflake / AWS Redshift-specific features.
  - Collaborate with stakeholders to understand the requirements of the data warehouse.
  - Implement data security, privacy, and compliance measures.
  - Perform data analysis, troubleshoot data issues, and provide technical support to end-users.
  - Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
  - Stay current with new AWS/Snowflake services and features and recommend improvements to existing architecture.
  - Design and implement scalable, secure, and cost-effective cloud solutions using AWS / Snowflake services.
  - Collaborate with cross-functional teams to understand requirements and provide technical guidance.
Posted 6 days ago
7.0 - 10.0 years
7 - 10 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Data Modeller JD:

We are seeking a skilled Data Modeller to join our Corporate Banking team. The ideal candidate will have a strong background in creating data models for various banking services, including Current Account Savings Account (CASA), Loans, and Credit Services. This role involves collaborating with the Data Architect to define data model structures within a data mesh environment and coordinating with multiple departments to ensure cohesive data management practices.

Data Modelling:
- Design and develop data models for CASA, Loan, and Credit Services, ensuring they meet business requirements and compliance standards.
- Create conceptual, logical, and physical data models that support the bank's strategic objectives (a physical-model sketch follows below).
- Ensure data models are optimized for performance, security, and scalability to support business operations and analytics.

Collaboration with Data Architect:
- Work closely with the Data Architect to establish the overall data architecture strategy and framework.
- Contribute to the definition of data model structures within a data mesh environment.

Data Quality and Governance:
- Ensure data quality and integrity in the data models by implementing best practices in data governance.
- Assist in the establishment of data management policies and standards.
- Conduct regular data audits and reviews to ensure data accuracy and consistency across systems.

Tools and Technologies:
- Data Modelling Tools: ERwin, IBM InfoSphere Data Architect, Oracle Data Modeler, Microsoft Visio, or similar tools.
- Databases: SQL, Oracle, MySQL, MS SQL Server, PostgreSQL, Neo4j Graph.
- Data Warehousing Technologies: Snowflake, Teradata, or similar.
- ETL Tools: Informatica, Talend, Apache NiFi, Microsoft SSIS, or similar.
- Big Data Technologies: Hadoop, Spark (optional but preferred).
- Cloud Platforms: Experience with data modelling on cloud platforms, e.g., Microsoft Azure (Synapse, Data Factory).
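As a flavour of the physical modelling involved, a minimal, hypothetical star-schema sketch for a CASA daily-balance fact and its dimensions. The names, columns, and grain are illustrative assumptions, not the bank's actual model.

```python
# Generic SQL DDL held in a Python string; in practice this would be run
# through the team's migration tooling against the target warehouse.
DDL = """
CREATE TABLE dim_customer (
    customer_sk   INTEGER PRIMARY KEY,      -- surrogate key
    customer_id   VARCHAR(20) NOT NULL,     -- natural/business key
    full_name     VARCHAR(200),
    segment       VARCHAR(50)
);

CREATE TABLE dim_account (
    account_sk    INTEGER PRIMARY KEY,
    account_no    VARCHAR(20) NOT NULL,
    product_type  VARCHAR(20)               -- e.g. CASA, LOAN, CREDIT
);

-- Grain: one row per account per business day.
CREATE TABLE fact_casa_daily_balance (
    date_key      INTEGER NOT NULL,
    account_sk    INTEGER NOT NULL REFERENCES dim_account (account_sk),
    customer_sk   INTEGER NOT NULL REFERENCES dim_customer (customer_sk),
    eod_balance   NUMERIC(18, 2),
    PRIMARY KEY (date_key, account_sk)
);
"""
print(DDL)
```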
Posted 6 days ago
8.0 - 13.0 years
2 - 11 Lacs
Bengaluru / Bangalore, Karnataka, India
Remote
- Working knowledge of MongoDB (6.0 or above); experience with sharding and replica sets.
- Working knowledge of database installation, setup, creation, and maintenance processes.
- Working knowledge of Change Streams and Mongo ETLs to replicate live changes to downstream analytics systems (see the sketch below).
- Experience running MongoDB in a containerized environment (EKS clusters).
- Support reliability engineering tasks for all other database platforms (SQL, MySQL, Postgres, Snowflake, Kafka).
- Experience with Cloud Manager or Ops Manager (a plus).
- Understanding of networking components on AWS and GCP cloud.
- Technical knowledge of backup/recovery, disaster recovery, and high availability techniques.
- Strong technical knowledge in writing shell scripts used to support database administration.
- Good understanding of Kafka and Snowflake administration.
- Good understanding of Debezium, Kafka, Zookeeper, and Snowflake is a plus.
- Automate database routine tasks independently with shell, Python, and other languages.

- 8+ years of experience in managing MongoDB on-prem and in Atlas Cloud.
- Be a part of the database team in developing next-generation database systems.
- Provide services in administration and performance monitoring of database-related systems.
- Develop system administration standards and procedures to maintain practices.
- Support backup and recovery strategies.
- Contribute to the creative process of improving architectural designs and implementing new architectures.
- Expertise in delivering efficiency and cost effectiveness.
- Monitor and support capacity planning and analysis.
- Monitor performance, troubleshoot issues, and proactively tune databases and workloads.
- Sound knowledge of Terraform and Grafana; manage infrastructure as code using Terraform and GitLab.
- Ability to work remotely.
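Since the role calls out Change Streams feeding downstream analytics, a hedged pymongo sketch of tailing a replica-set change stream. The connection string, database, and collection names are placeholders.

```python
from pymongo import MongoClient

# Change streams require a replica set or sharded cluster (MongoDB 3.6+).
client = MongoClient("mongodb://user:***@host1,host2,host3/?replicaSet=rs0")
orders = client["shop"]["orders"]

# "updateLookup" asks the server to attach the full current document to updates.
with orders.watch(full_document="updateLookup") as stream:
    for change in stream:
        op = change["operationType"]   # insert / update / delete / replace ...
        doc = change.get("fullDocument")
        print(op, doc)                 # replace with a Kafka or Snowflake write
```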
Posted 6 days ago
8.0 - 13.0 years
3 - 11 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
Remote
You will provide 24/7 administrative support (on-prem and Atlas Cloud) for MongoDB clusters, Postgres, and Snowflake, and support for on-prem and Confluent Cloud Kafka clusters. You will review database designs to ensure all technical and business requirements are met, and perform database optimization and testing to ensure service level agreements are met. You will provide support during system implementation and in production, support Snowflake administrative tasks (data pipelines, object creation, access), and participate in weekday and weekend on-call rotations to support products running on Mongo, SQL, Kafka, Snowflake, and other RDBMS systems. This role does not have any managerial responsibilities; it is an individual contributor role. You will report to the Sr. Manager, Reliability Engineering.

What Your Responsibilities Will Be
- 8+ years of experience in managing MongoDB on-prem and in Atlas Cloud.
- Be a part of the database team in developing next-generation database systems.
- Provide services in administration and performance monitoring of database-related systems.
- Develop system administration standards and procedures to maintain practices.
- Support backup and recovery strategies.
- Contribute to the creative process of improving architectural designs and implementing new architectures.
- Expertise in delivering efficiency and cost effectiveness.
- Monitor and support capacity planning and analysis.
- Monitor performance, troubleshoot issues, and proactively tune databases and workloads.
- Sound knowledge of Terraform and Grafana; manage infrastructure as code using Terraform and GitLab.
- Ability to work remotely.

What You'll Need to be Successful
- Working knowledge of MongoDB (6.0 or above); experience with sharding and replica sets.
- Working knowledge of database installation, setup, creation, and maintenance processes.
- Working knowledge of Change Streams and Mongo ETLs to replicate live changes to downstream analytics systems.
- Experience running MongoDB in a containerized environment (EKS clusters).
- Support reliability engineering tasks for all other database platforms (SQL, MySQL, Postgres, Snowflake, Kafka).
- Experience with Cloud Manager or Ops Manager (a plus).
- Understanding of networking components on AWS and GCP cloud.
- Technical knowledge of backup/recovery, disaster recovery, and high availability techniques.
- Strong technical knowledge in writing shell scripts used to support database administration.
- Good understanding of Kafka and Snowflake administration.
- Good understanding of Debezium, Kafka, Zookeeper, and Snowflake is a plus.
Posted 6 days ago
8.0 - 13.0 years
2 - 11 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
What Your Responsibilities Will Be
Avalara is looking for a data analytics engineer who can solve and scale real-world big data challenges, bringing end-to-end analytics experience and the ability to tell a complex data story with data models and reliable, applicable metrics.
- Build and deploy data science models using complex SQL, Python, DBT data modelling, and reusable visualization components (PowerBI/Tableau/Hex/R-shiny, etc.).
- Expert-level experience in PowerBI, SQL, and Snowflake.
- Solve needs at large scale by applying your software engineering and complex data skills.
- Lead and help develop a roadmap for the area and the team.
- Analyze fault tolerance and high availability issues, performance, and scale challenges, and solve them.
- Lead programs and collaborate with engineers, product managers, and technical program managers across teams.
- Understand the trade-offs between consistency, durability, and costs to build solutions that can meet the demands of growing services.
- Ensure the operational readiness of the services and meet the commitments to our customers regarding availability and performance.
- Manage end-to-end project plans and ensure on-time delivery. Communicate the status and big picture to the project team and management.
- Work with business and engineering teams to identify scope, constraints, dependencies, and risks. Identify risks and opportunities across the business and guide solutions.

What You'll Need to be Successful
- Bachelor's Engineering degree in Computer Science or a related field.
- 8+ years of enterprise-class experience with large-scale cloud solutions in data science/analytics projects and engineering projects.
- Expert-level experience in PowerBI, SQL, and Snowflake.
- Experience with data visualization, Python, data modeling, and data storytelling.
- Experience architecting complex data marts applying DBT.
- Ability to architect and build data solutions that apply data quality and anomaly detection best practices (see the sketch below).
- Experience building production analytics using the Snowflake data platform.
- Experience with AWS and Snowflake tools and services.

Good to have:
- Snowflake certification is a plus.
- Relevant certifications in data warehousing or cloud platforms.
- Experience architecting complex data marts applying DBT and Airflow.
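An illustrative data-quality sketch in the spirit of the posting's "anomaly detection best practices": flag a daily load whose volume deviates sharply from recent history. The counts here are made up; in practice they would come from a warehouse query.

```python
import statistics

# Hypothetical daily row counts for a loaded table; the last value is anomalous.
daily_counts = [10480, 10515, 10392, 10610, 10555, 4210]

history, latest = daily_counts[:-1], daily_counts[-1]
mean = statistics.mean(history)
stdev = statistics.stdev(history)
z = (latest - mean) / stdev  # standard deviations from the historical mean

if abs(z) > 3:
    print(f"ALERT: today's row count {latest} is {z:.1f} sigma from the mean")
```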
Posted 6 days ago
6.0 - 12.0 years
6 - 12 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: DBT Data Engineer

Key Responsibilities
As a DBT Data Engineer, you will:
- SQL Development & Optimization: Apply strong SQL skills, including Common Table Expressions (CTEs), for data manipulation and optimization (see the sketch below).
- Database Management: Utilize Snowflake or equivalent strong database experience for data handling, including SQL tuning.
- DBT Implementation: Work with DBT Core or DBT Cloud for data transformation and modeling.
- Development Best Practices: Adhere to and promote development best practices, including peer reviews and unit testing.
- Data Modeling: Apply fundamental data modeling concepts such as 3NF, Star, and Snowflake schemas, and understand the grain of data.
- Data Analysis & Profiling: Perform data analysis and data profiling tasks to ensure data quality and understanding.

Mandatory Skills & Experience
- Strong SQL skills, including Common Table Expressions (CTEs).
- Experience with Snowflake or very strong database experience, including SQL tuning.
- Experience in DBT Core or DBT Cloud.
- Good understanding of development best practices, peer reviews, and unit testing.
- Understanding of fundamental data modeling concepts such as 3NF, Star, and Snowflake schemas, and the grain of data.
- Experience in Data Analysis and Data Profiling tasks.
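A small sketch of the CTE pattern the posting emphasises: deduplicating a feed to a declared grain (one row per order) before modelling. Table and column names are illustrative.

```python
# SQL held in a Python string; in DBT, this SELECT would be the body of a model file.
QUERY = """
WITH ranked AS (                         -- CTE: rank duplicates per business key
    SELECT
        order_id,
        customer_id,
        amount,
        loaded_at,
        ROW_NUMBER() OVER (
            PARTITION BY order_id        -- the declared grain
            ORDER BY loaded_at DESC      -- keep the latest record
        ) AS rn
    FROM raw.orders
)
SELECT order_id, customer_id, amount
FROM ranked
WHERE rn = 1
"""
print(QUERY)
```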
Posted 6 days ago
8.0 - 13.0 years
3 - 11 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
What Your Responsibilities Will Be
Avalara is looking for a data analytics engineer who can solve and scale real-world big data challenges, bringing end-to-end analytics experience and the ability to tell a complex data story with data models and reliable, applicable metrics.
- Build and deploy data science models using complex SQL, Python, DBT data modelling, and reusable visualization components (PowerBI/Tableau/Hex/R-shiny, etc.).
- Expert-level experience in PowerBI, SQL, and Snowflake.
- Solve needs at large scale by applying your software engineering and complex data skills.
- Lead and help develop a roadmap for the area and the team.
- Analyze fault tolerance and high availability issues, performance, and scale challenges, and solve them.
- Lead programs and collaborate with engineers, product managers, and technical program managers across teams.
- Understand the trade-offs between consistency, durability, and costs to build solutions that can meet the demands of growing services.
- Ensure the operational readiness of the services and meet the commitments to our customers regarding availability and performance.
- Manage end-to-end project plans and ensure on-time delivery. Communicate the status and big picture to the project team and management.
- Work with business and engineering teams to identify scope, constraints, dependencies, and risks. Identify risks and opportunities across the business and guide solutions.

What You'll Need to be Successful
- Bachelor's Engineering degree in Computer Science or a related field.
- 8+ years of enterprise-class experience with large-scale cloud solutions in data science/analytics projects and engineering projects.
- Expert-level experience in PowerBI, SQL, and Snowflake.
- Experience with data visualization, Python, data modeling, and data storytelling.
- Experience architecting complex data marts applying DBT.
- Ability to architect and build data solutions that apply data quality and anomaly detection best practices.
- Experience building production analytics using the Snowflake data platform.
- Experience with AWS and Snowflake tools and services.

Good to have:
- Snowflake certification is a plus.
- Relevant certifications in data warehousing or cloud platforms.
- Experience architecting complex data marts applying DBT and Airflow.
Posted 6 days ago
8.0 - 10.0 years
1 Lacs
Chennai, Tamil Nadu, India
On-site
As a Solution Architect - Snowflake, you will play a crucial role in designing, building, and maintaining scalable data pipelines and infrastructure. You will work with a variety of technologies, including Scala, Python, Spark, AWS services, and SQL, to support our data processing and analytics needs.

Responsibilities:
- Collaborate with stakeholders to finalize the scope of enhancements and development projects, and gather detailed requirements.
- Apply expertise in ETL/ELT processes and tools to design and implement data pipelines that fulfil business requirements.
- Provide expertise as a technical resource to solve complex business issues that translate into data integration and database systems designs.
- Migrate and modernize existing legacy ETL jobs for Snowflake, ensuring data integrity and optimal performance.
- Analyze existing ETL jobs and identify opportunities for creating reusable patterns and components to expedite future development.
- Develop and implement a configuration-driven data ingestion framework that enables efficient onboarding of new source tables (see the sketch below).
- Collaborate with cross-functional teams, including business analysts and solution architects, to align data engineering initiatives with business goals.
- Drive continuous improvement initiatives to enhance data engineering processes, tools, and frameworks.
- Ensure compliance with data quality, security, and privacy standards across all data engineering activities.
- Participate in code reviews, provide constructive feedback, and ensure high-quality, maintainable code.
- Prepare and present technical documentation, including data flow diagrams, ETL specifications, and architectural designs.

Educational Qualifications:
- Engineering Degree: BE/ME/BTech/MTech/BSc/MSc.
- Cloud certifications (AWS, etc.) and relevant technical certifications in multiple technologies are desirable.

Skills
Mandatory Technical Skills:
- Strong experience in Snowflake, having executed development and migration projects involving Snowflake.
- Strong working experience in ETL tools (Matillion/DBT/Fivetran/ADF preferably).
- Experience writing SQL, including flattening tables, and experience with JSON is good to have; able to write complex queries.
- Strong understanding of SQL queries, good coding experience in Python, and experience deploying data warehousing pipelines into Snowflake.
- Experience with large databases.
- Working knowledge of AWS (S3, KMS, and more) or Azure/GCP.
- Design, develop, and thoroughly test new ETL/ELT code, ensuring accuracy, reliability, and adherence to best practices.
- Snowflake, Python/Spark/JavaScript, AWS/Azure/GCP, SQL.

Good to have skills:
- CI/CD (DevOps)
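A hypothetical sketch of the "configuration-driven data ingestion framework" the posting mentions: one generic loader driven by per-table config entries, so onboarding a new source table means adding one config entry. The stage, table, and file-format names are assumptions.

```python
import snowflake.connector

# Per-table configuration; in practice this might live in YAML or a control table.
TABLES = [
    {"name": "CUSTOMERS", "stage": "@raw_stage/customers/", "format": "CSV_FMT"},
    {"name": "ORDERS",    "stage": "@raw_stage/orders/",    "format": "JSON_FMT"},
]

def load_table(cur, cfg: dict) -> None:
    """Generate and run a COPY INTO for one configured source table."""
    cur.execute(
        f"COPY INTO STG.{cfg['name']} FROM {cfg['stage']} "
        f"FILE_FORMAT = (FORMAT_NAME = '{cfg['format']}')"
    )

conn = snowflake.connector.connect(
    account="my_account", user="loader", password="***",
    warehouse="LOAD_WH", database="EDW",
)
cur = conn.cursor()
for cfg in TABLES:
    load_table(cur, cfg)
conn.close()
```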
Posted 6 days ago
7.0 - 8.0 years
1 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
As a Technical Lead - Azure Snowflake DBT, you will be part of an Agile team building healthcare applications and implementing new features while adhering to the best coding and development standards.

Responsibilities:
- Design, develop, and maintain data processing systems using Azure Snowflake.
- Design and develop robust data integration solutions using Data Build Tool (DBT) and other data pipeline tools.
- Work with complex SQL functions and transform large data sets to meet business requirements.
- Drive the creation and maintenance of data models that support analytics use cases and business objectives.
- Collaborate with various stakeholders, including technical teams, functional SMEs, and business users, to understand and address data needs.
- Create low-level design documents and unit test strategies and plans in adherence to defined processes and guidelines.
- Perform code reviews and unit test plan reviews to ensure high quality of code and deliverables.
- Ensure data quality and integrity through validation, cleansing, and enrichment processes.
- Support end-to-end testing and validation, including UAT and product testing.
- Take ownership of problems, demonstrate a proactive approach to problem solving, and lead solutions to completion.

Educational Qualifications:
- Engineering Degree: BE/ME/BTech/MTech/BSc/MSc.
- Technical certification in multiple technologies is desirable.

Skills
Mandatory Technical Skills:
- Over 5 years of experience in cloud data architecture and analytics.
- Proficient in Azure, Snowflake, SQL, and DBT.
- Extensive experience in designing and developing data integration solutions using DBT and other data pipeline tools.
- Excellent communication and teamwork skills.
- Self-initiated problem solver with a strong sense of ownership.

Good to Have Skills:
- Experience in other data processing tools and technologies.
- Familiarity with agile development methodologies.
- Strong analytical and problem-solving skills.
- Experience in the healthcare domain.
Posted 6 days ago
8.0 - 10.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Cloud Solution Delivery Lead Consultant to join our team in Bengaluru, Karnataka (IN-KA), India (IN).

Data Engineer Lead
- Robust hands-on experience with industry-standard tooling and techniques, including SQL, Git, and CI/CD pipelines (mandatory).
- Management, administration, and maintenance of data streaming tools such as Kafka/Confluent Kafka and Flink (see the sketch below).
- Experienced with software support for applications written in Python and SQL.
- Administration, configuration, and maintenance of Snowflake and DBT.
- Experience with data product environments that use tools such as Kafka Connect, Snyk, Confluent Schema Registry, Atlan, IBM MQ, SonarQube, Apache Airflow, Apache Iceberg, DynamoDB, Terraform, and GitHub.
- Debugging issues, root cause analysis, and applying fixes.
- Management and maintenance of ETL processes (bug fixing and batch job monitoring).

Training & Certification
- Apache Kafka Administration
- Snowflake Fundamentals/Advanced Training

Experience
- 8 years of experience in a technical role working with AWS.
- At least 2 years in a leadership or management role.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
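For a taste of the routine Kafka administration work listed above, a hedged sketch using confluent-kafka's AdminClient to list topics and partition counts; the broker address is a placeholder.

```python
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "broker1:9092"})

# List topics and their partition counts -- a typical batch-monitoring sanity check.
metadata = admin.list_topics(timeout=10)
for name, topic in sorted(metadata.topics.items()):
    print(f"{name}: {len(topic.partitions)} partitions")
```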
Posted 1 week ago
0.0 - 5.0 years
0 - 5 Lacs
Pune, Maharashtra, India
On-site
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

Your Role and Responsibilities
In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required Education: Bachelor's Degree
Preferred Education: Master's Degree

Required Technical and Professional Expertise
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
- Hands-on experience with dimensional and relational data modeling techniques to support analytics and reporting requirements.

Preferred Technical and Professional Experience
- Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
- Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks.
Posted 1 week ago
0.0 - 5.0 years
0 - 5 Lacs
Navi Mumbai, Maharashtra, India
On-site
Role Overview
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities
- Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans.
- Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and comply with industry best practices.
- Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security.
- Data Integration: Define and implement data integration strategies to facilitate seamless flow of information across systems.

Responsibilities:
- Experience in data architecture and engineering.
- Proven expertise with the Snowflake data platform.
- Strong understanding of ETL/ELT processes and data integration.
- Experience with data modeling and data warehousing concepts.
- Familiarity with performance tuning and optimization techniques.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Required Education: Bachelor's Degree
Preferred Education: Master's Degree

Required Technical and Professional Expertise
- Cloud & Data Architecture: AWS, Snowflake
- ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions
- Big Data & Analytics: Athena, Presto, Hadoop
- Database & Storage: SQL, SnowSQL
- Security & Compliance: IAM, KMS, Data Masking

Preferred Technical and Professional Experience
- Cloud Data Warehousing: Snowflake (Data Modeling, Query Optimization)
- Data Transformation: DBT (Data Build Tool) for ELT pipeline management
- Metadata & Data Governance: Alation (Data Catalog, Lineage, Governance)
Posted 1 week ago
5.0 - 7.0 years
5 - 7 Lacs
Pune, Maharashtra, India
On-site
Required technical and professional expertise
- 5+ years of experience with BI tools, with expertise and/or certification in at least one major BI platform; Tableau preferred.
- Advanced knowledge of SQL, including the ability to write complex stored procedures, views, and functions.
- Proven capability in data storytelling and visualization, delivering actionable insights through compelling presentations.
- Excellent communication skills, with the ability to convey complex analytical findings to non-technical stakeholders in a clear, concise, and meaningful way.
- Identifying and analyzing industry trends, geographic variations, competitor strategies, and emerging customer behavior.

Preferred technical and professional experience
- Troubleshooting capabilities to debug data controls.
- Capable of converting business requirements into a workable model.
- Good communication skills, willingness to learn new technologies; team player, self-motivated, positive attitude.
- Must have a thorough understanding of SQL and advanced SQL (joins and relationships).
Posted 1 week ago
3.0 - 7.0 years
3 - 7 Lacs
Mumbai, Maharashtra, India
On-site
Job description
A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.

Job Description - Grade Specific
A senior leadership role that entails the oversight of multiple teams or a substantial team of data platform engineers, the management of intricate data infrastructure projects, and the making of strategic decisions that shape technological direction within the realm of data platform engineering. Key responsibilities encompass:
- Strategic Leadership: Leading multiple data platform engineering teams, steering substantial projects, and setting the strategic course for data platform development and operations.
- Complex Project Management: Supervising the execution of intricate data infrastructure projects, ensuring alignment with client objectives and the delivery of value.
- Technical and Strategic Decision-Making: Making well-informed decisions concerning data platform architecture, tools, and processes, balancing technical considerations with broader business goals.
- Influencing Technical Direction: Utilising profound technical expertise in data platform engineering to influence the direction of the team and the client, driving enhancements in data platform technologies and processes.
- Innovation and Contribution to the Discipline: Serving as innovators and influencers within the field of data platform engineering, contributing to the advancement of the discipline through thought leadership and the sharing of knowledge.
- Leadership and Mentorship: Offering mentorship and guidance to both managers and technical personnel, cultivating a culture of excellence and innovation within the domain of data platform engineering.
Posted 1 week ago
0.0 - 5.0 years
0 - 5 Lacs
Pune, Maharashtra, India
On-site
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
- Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment.
- Develop and maintain a strong working relationship with business and technical members of the team.
- Relentless focus on quality and continuous improvement.
- Perform root cause analysis of report issues.
- Development and evolutionary maintenance of the environment, performance, capability, and availability.
- Assist in defining technical requirements and developing solutions.
- Effective content and source-code management, troubleshooting, and debugging.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Tableau Desktop Specialist; strong understanding of SQL for querying databases. Good to have: Python, Snowflake, statistics, ETL experience.
- Extensive knowledge of creating impactful visualizations using Tableau.
- Must have a thorough understanding of SQL and advanced SQL (joins and relationships).
- Must have experience working with different databases and blending and creating relationships in Tableau.
- Must have extensive knowledge of creating Custom SQL to pull desired data from databases.
- Troubleshooting capabilities to debug data controls.

Preferred technical and professional experience
- Troubleshooting capabilities to debug data controls.
- Capable of converting business requirements into a workable model.
- Good communication skills, willingness to learn new technologies; team player, self-motivated, positive attitude.
- Must have a thorough understanding of SQL and advanced SQL.
Posted 1 week ago
3.0 - 10.0 years
1 - 2 Lacs
Pune, Maharashtra, India
On-site
Our client is an EU subsidiary of a global financial bank working in multiple markets and asset classes. The DWH/ETL Tester will work closely with the development team to design and build interfaces and integrate data from a variety of internal and external data sources into the new Enterprise Data Warehouse environment. The ETL Tester will be primarily responsible for testing the Enterprise Data Warehouse using automation, within industry-recognized ETL standards, architecture, and best practices.

Responsibilities
- Perform intake of new ETL projects and initiatives; make a high-level assessment of the roadmap in collaboration with leadership.
- Design test strategies and test plans that address the needs of cloud-based ETL pipelines.
- Contribute to and manage testing deliverables.
- Ensure the implementation of test standards and best practices for the agile model, and contribute to their development.
- Engage with internal stakeholders in various areas of the organization to seek alignment and collaboration; deal with external stakeholders/vendors.
- Identify risks and issues and present associated mitigating actions, taking into account the criticality of the domain of the underlying business.
- Contribute to continuous improvement of standard testing processes.

Skills
- Expert-level knowledge of Data Warehouse and RDBMS concepts.
- Expertise in new-age cloud-based Data Warehouse solutions: ADF, Snowflake, GCP, etc.
- Hands-on expertise in writing complex SQL using multiple JOINs and highly complex functions to test various transformations and ETL requirements (see the sketch below).
- Knowledge and experience in creating test automation for database and ETL testing regression suites: automation using Selenium with Python (or JavaScript), Python scripts, shell scripts.
- Knowledge of framework design and REST API testing of databases using Python.
- Experience using the Atlassian tool set and Azure DevOps.
- Experience in code and version management: Git, Bitbucket, Azure Repos, etc.

Qualifications
- A bachelor's degree or equivalent experience in computer science or similar.
- Experience in crafting test strategies and supervising ETL DWH test activities on multi-platform and sophisticated cloud-based environments.
- Strong analytical mindset with the ability to extract relevant information from documentation, system data, clients, and colleagues, and analyze the captured information.
- ISTQB Foundation Certificate in Software Testing.
- Optional/preferred: experience in the financial industry, knowledge of Regulatory Reporting and the terms/terminology used.

Important to Have
- Proficiency in English (read/write/speak).
- Able to demonstrate the ability to learn new technologies and easily adapt to new circumstances, technologies, and procedures.
- Stress resistant and constructive whatever the context.
- Able to align with existing standards and act with attention to detail.
- A true standout colleague who demonstrates good interpersonal skills; able to summarize complex technical situations in simple terms.
- Solution and customer focused. Good communication skills, a positive attitude, and a competitive but team-oriented focus are key to success in this challenging environment.

Nice to Have
- Experience in the financial industry, knowledge of Regulatory Reporting and the terms/terminology used.
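An illustrative automated DWH test in the pytest style this role implies: compare a source aggregate against the warehouse target. The `source_conn` and `target_conn` fixtures, along with the table names, are assumptions that would be defined in the suite's conftest.

```python
import pytest

def run_scalar(conn, sql: str):
    """Execute a query and return its single scalar result."""
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchone()[0]

# source_conn and target_conn are hypothetical pytest fixtures providing
# DB-API connections to the source system and the warehouse, respectively.
@pytest.mark.parametrize("column", ["amount", "fee"])
def test_totals_match(source_conn, target_conn, column):
    src = run_scalar(source_conn, f"SELECT SUM({column}) FROM orders")
    tgt = run_scalar(target_conn, f"SELECT SUM({column}) FROM edw.fact_orders")
    assert src == tgt, f"SUM({column}) mismatch: source {src} vs target {tgt}"
```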
Posted 1 week ago
4.0 - 8.0 years
4 - 8 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
- Experience in data architecture and engineering.
- Proven expertise with the Snowflake data platform.
- Strong understanding of ETL/ELT processes and data integration.
- Experience with data modeling and data warehousing concepts.
- Familiarity with performance tuning and optimization techniques.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Cloud & Data Architecture: AWS, Snowflake
- ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions
- Big Data & Analytics: Athena, Presto, Hadoop
- Database & Storage: SQL, SnowSQL
- Security & Compliance: IAM, KMS, Data Masking

Preferred technical and professional experience
- Cloud Data Warehousing: Snowflake (Data Modeling, Query Optimization)
- Data Transformation: DBT (Data Build Tool) for ELT pipeline management
- Metadata & Data Governance: Alation (Data Catalog, Lineage, Governance)
Posted 1 week ago
5.0 - 15.0 years
22 - 24 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
This role is for one of Weekday's clients.
Salary range: Rs 2,200,000 - Rs 2,400,000 (i.e., INR 22-24 LPA)
Min Experience: 5 years
Location: Bengaluru, Chennai, Gurgaon
Job Type: full-time

We are looking for an experienced Snowflake Developer to join our Data Engineering team. The ideal candidate will possess a deep understanding of Data Warehousing, SQL, ETL tools like Informatica, and visualization platforms such as Power BI. This role involves building scalable data pipelines, optimizing data architectures, and collaborating with cross-functional teams to deliver impactful data solutions.

Requirements

Key Responsibilities
- Data Engineering & Warehousing: Leverage over 5 years of hands-on experience in Data Engineering with a focus on Data Warehousing and Business Intelligence.
- Pipeline Development: Design and maintain ELT pipelines using Snowflake, Fivetran, and DBT to ingest and transform data from multiple sources.
- SQL Development: Write and optimize complex SQL queries and stored procedures to support robust data transformations and analytics.
- Data Modeling & ELT: Implement advanced data modeling practices, including SCD Type-2 (sketched below), and build high-performance ELT workflows using DBT.
- Requirement Analysis: Partner with business stakeholders to capture data needs and convert them into scalable technical solutions.
- Data Quality & Troubleshooting: Conduct root cause analysis on data issues, maintain high data integrity, and ensure reliability across systems.
- Collaboration & Documentation: Collaborate with engineering and business teams. Develop and maintain thorough documentation for pipelines, data models, and processes.

Skills & Qualifications
- Expertise in Snowflake for large-scale data warehousing and ELT operations.
- Strong SQL skills with the ability to create and manage complex queries and procedures.
- Proven experience with Informatica PowerCenter for ETL development.
- Proficiency with Power BI for data visualization and reporting.
- Hands-on experience with Fivetran for automated data integration.
- Familiarity with DBT, Sigma Computing, Tableau, and Oracle.
- Solid understanding of data analysis, requirement gathering, and source-to-target mapping.
- Knowledge of cloud ecosystems such as Azure (including ADF, Databricks); experience with AWS or GCP is a plus.
- Experience with workflow orchestration tools like Airflow, Azkaban, or Luigi.
- Proficiency in Python for scripting and data processing (Java or Scala is a plus).
- Bachelor's or Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related field.

Key Tools & Technologies
- Snowflake, SnowSQL, Snowpark
- SQL, Informatica, Power BI, DBT
- Python, Fivetran, Sigma Computing, Tableau
- Airflow, Azkaban, Azure, Databricks, ADF
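Since the role calls out SCD Type-2, a minimal sketch of the expire-on-change half of that pattern as a Snowflake-style MERGE held in a Python string. Table and column names are illustrative assumptions.

```python
# Expires the current dimension row when a tracked attribute changes and
# inserts brand-new keys. Note: changed keys still need a follow-up INSERT
# of their new current-version row; a single MERGE cannot do both for the
# same matched key.
SCD2_MERGE = """
MERGE INTO dim_customer d
USING stg_customer s
    ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHEN MATCHED AND d.segment <> s.segment THEN UPDATE SET
    is_current = FALSE,
    valid_to   = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT
    (customer_id, segment, valid_from, valid_to, is_current)
VALUES
    (s.customer_id, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE);
"""
print(SCD2_MERGE)
```

In a DBT project this logic is usually delegated to snapshots rather than hand-written MERGE statements.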
Posted 1 week ago